Corelab Seminar
2011-2012
Nikolaos Leonardos
Information theory methods in communication complexity
Abstract (of the PhD Thesis).
This thesis is concerned with the application of notions and
methods from the field of information theory to the field of
communication complexity. It consists of two main parts.
In the first part of the thesis, we prove lower bounds on the
randomized two-party communication complexity of functions that
arise from read-once Boolean formulae.
A read-once Boolean formula is a formula in propositional logic
in which every variable appears exactly once. Such a formula can
be represented by a tree whose leaves correspond to the variables
and whose internal nodes are labeled by binary connectives. Under
certain assumptions, this representation is unique, so one can
define the depth of a formula as the depth of the tree that
represents it.
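For illustration, here is a minimal Python sketch of this tree
representation (the encoding and names are hypothetical, not taken
from the thesis): leaves are variables, and internal nodes carry
the binary connectives AND and OR.

    # A read-once formula as a nested tuple: a leaf is a variable name,
    # an internal node is (connective, left subtree, right subtree).
    def evaluate(node, assignment):
        """Evaluate the formula under a truth assignment (dict: var -> bool)."""
        if isinstance(node, str):          # leaf: look up the variable
            return assignment[node]
        op, left, right = node
        l, r = evaluate(left, assignment), evaluate(right, assignment)
        return (l and r) if op == "AND" else (l or r)

    def depth(node):
        """Depth of the representing tree; a lone leaf has depth 0."""
        if isinstance(node, str):
            return 0
        _, left, right = node
        return 1 + max(depth(left), depth(right))

    # f = (x1 AND x2) OR (x3 AND x4): read-once, since every variable
    # occurs exactly once; its representing tree has depth 2.
    f = ("OR", ("AND", "x1", "x2"), ("AND", "x3", "x4"))
    print(evaluate(f, {"x1": True, "x2": True, "x3": False, "x4": False}))  # True
    print(depth(f))  # 2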
The complexity of the evaluation of general read-once formulae
has attracted interest mainly in the decision tree model. In the
communication complexity model, many interesting results concern
specific read-once formulae, such as disjointness and tribes. In
this thesis we use information-theoretic methods to prove lower
bounds that hold for any read-once formula. Our lower bounds are
of the form n(f)/c^{d(f)}, where n(f) is the number of variables
of the formula f, d(f) is its depth, and c is a constant; they
are optimal up to the value of the constant in the base of the
denominator.
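To read the bound concretely, consider the tribes function, an
AND of ORs over disjoint blocks of variables (an illustrative
instance; it assumes, as is standard for such canonical trees,
that adjacent nodes with the same connective are merged, so that
tribes has depth 2):

    f = AND_{i=1}^{m} OR_{j=1}^{m} x_{ij},  with n(f) = m^2 = n and d(f) = 2,

so the lower bound reads n(f)/c^{d(f)} = n/c^2 = Omega(n), in line
with the known linear lower bound for tribes.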
In the second part of the thesis, we explore the applicability
of the information-theoretic method in the
number-on-the-forehead communication complexity model. The work
of Bar-Yossef, Jayram, Kumar, and Sivakumar revealed a beautiful
connection between Hellinger distance and two-party randomized
communication protocols. Inspired by their work and motivated by
open questions in the number-on-the-forehead model, we introduce
the notion of Hellinger volume and show that it lower-bounds the
information cost of multi-party protocols.
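For orientation, the squared Hellinger distance between two
distributions P and Q on a finite set Omega is the standard
quantity

    h^2(P, Q) = 1 - sum_{w in Omega} sqrt(P(w) Q(w))
              = (1/2) sum_{w in Omega} (sqrt(P(w)) - sqrt(Q(w)))^2;

Hellinger volume, as its name suggests, extends this
two-distribution quantity to families of several distributions
(the precise definition is given in the thesis).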
We provide a small toolbox for manipulating several Hellinger
volume terms and for lower-bounding a Hellinger volume when the
distributions involved satisfy certain conditions. In doing so,
we prove a new upper bound on the difference between the
arithmetic mean and the geometric mean in terms of relative
entropy. Finally, we show how to apply the new tools to obtain
a lower bound on the informational complexity of the AND_k
function.
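For context on the arithmetic-geometric mean comparison above, a
classical identity (stated here for orientation; it is not the
new bound proved in the thesis) already links the two means to
relative entropy: for a probability vector p and positive reals
a_i,

    log(sum_i p_i a_i) - sum_i p_i log a_i = D(p || q),
    where q_i = p_i a_i / sum_j p_j a_j;

that is, the gap between the logarithms of the weighted arithmetic
and geometric means is exactly a relative entropy, whereas the
thesis's new bound controls the difference of the means themselves.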